

Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

Neural Information Processing Systems

Various differentially private algorithms instantiate the exponential mechanism, and require sampling from the distribution $\exp(-f)$ for a suitable function $f$. When the domain of the distribution is high-dimensional, this sampling can be challenging. Using heuristic sampling schemes such as Gibbs sampling does not necessarily lead to provable privacy. When $f$ is convex, techniques from log-concave sampling lead to polynomial-time algorithms, albeit with large polynomials. Langevin dynamics-based algorithms offer much faster alternatives under some distance measures such as statistical distance. In this work, we establish rapid convergence for these algorithms under distance measures more suitable for differential privacy. For smooth, strongly-convex $f$, we give the first results proving convergence in R\'enyi divergence. This gives us fast differentially private algorithms for such $f$. Our techniques are simple and generic and apply also to underdamped Langevin dynamics.
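The discretized (unadjusted) Langevin dynamics the abstract refers to is a simple iteration: a gradient step on $f$ plus Gaussian noise. The sketch below is a minimal illustration, not the paper's algorithm or parameter choices; the function names (`ula_sample`, `grad_f`) and the quadratic test target are assumptions for the example, with $f(x) = \|x\|^2/2$ so that $\exp(-f)$ is a standard Gaussian.

```python
import numpy as np

def ula_sample(grad_f, x0, eta, n_steps, rng):
    """Unadjusted Langevin algorithm (ULA):
    x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2 * eta) * N(0, I).
    For small eta and smooth, strongly convex f, the iterates
    approximately sample from the density proportional to exp(-f)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * noise
    return x

# Example target: f(x) = ||x||^2 / 2, i.e. exp(-f) is a standard Gaussian,
# so grad_f(x) = x. Draw many independent chains and inspect the moments.
rng = np.random.default_rng(0)
samples = np.array([ula_sample(lambda x: x, np.zeros(2), 0.05, 300, rng)
                    for _ in range(1000)])
print(samples.mean(axis=0))  # near 0
print(samples.var(axis=0))   # near 1 (slightly biased high due to discretization)
```

Note the discretization bias: for this quadratic target the chain's stationary variance is $1/(1 - \eta/2)$ rather than exactly 1, which is why convergence analyses (in statistical distance or, as in this paper, Rényi divergence) must account for the step size $\eta$.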


Review for NeurIPS paper: Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

Neural Information Processing Systems

Despite hinting at such a result multiple times in the paper, the results presented here do not directly imply pure or approximate differential privacy for an algorithm that runs Langevin dynamics for T iterations. At the least, the argument does not go through without further assumptions. The reason is the following: the Rényi divergence bound on D(P‖R) (the order \alpha is omitted for readability) does not seem to imply a differential privacy bound overall, even though a sample from R satisfies DP. The DP bound of posterior sampling implies a bound on D(R‖R'). By the results of this paper, we have bounds on D(P‖R) and D(P'‖R'). Because Rényi divergence satisfies only a weak triangle inequality, these three bounds do not immediately combine into a bound on D(P‖P'), which is what a DP guarantee for the finite-time algorithm would require.


Review for NeurIPS paper: Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

Neural Information Processing Systems

The reviewers agree that this paper provides an interesting analysis of Langevin dynamics, with interesting implications for differential privacy. The presentation is clear and the technical results are novel. In their revision, the authors should clarify whether the finite-time variant of the dynamics actually yields a private algorithm.

